Job Title: Senior Data Engineer
Duration: 6 months
Location: Salt Lake City, Utah - Remote
Must Have
- CI/CD tools
- Git/GitHub
- ADO (Azure DevOps)
- Python development
- Experience with Docker, Kubernetes
- Apache Kafka
- Automated Testing
- Behavior-Driven Development (BDD)
- JSON
- API Development
As a Data Engineer, you'll contribute to the success of the team by delivering the following:
- Develop and deploy enterprise-grade platforms that enable data-driven solutions.
- Automate and maintain scalable infrastructure.
- Ensure delivery of highly available and scalable systems.
- Monitor all systems and applications and ensure optimal performance.
- Research and test new tools and applications.
- Analyze and design technical solutions to address business needs.
- Participate in troubleshooting applications and systems issues.
- Identify, investigate, and propose solutions to technical problems.
- Develop, test, and modify software to improve the efficiency of data platforms and applications; provide technical support for issues.
- Monitor system performance to maintain consistent uptime.
- Provide technical expertise where required within Information Technology projects.
- Prepare and maintain necessary documentation.
- Participate in regular agile ceremonies such as program increment planning, daily standups, team backlog grooming, iteration retrospectives, team demos, and inspect & adapt workshops.
- Deliver on the committed features and stories in the team backlog.
- Work with the project manager/scrum master to remove impediments.
- Serve in the goalie rotation to support production.
- Support test and QA efforts on the various data projects.
- Coordinate with data operations teams to deploy changes into production.
- At the highest level, may function as a team lead.
- Other duties as assigned.
Qualifications
- Requires a Bachelor's degree in Computer Science, Computer Engineering, or a related field, plus experience with ADO/Git, ETL, SQL, UNIX/Linux, Docker/Kubernetes, API development, JSON, Kafka, automated testing, BDD, Big Data distributed systems, programming languages such as Java and Python, and orchestration tools and processes, or other directly related experience.
- A combination of education and experience may meet qualifications.
- Basic knowledge of search technology, building real-time data pipelines, and programming languages such as Java and Python. Knowledge of ETL, UNIX/Linux, and scheduling and orchestration tools and processes.
- Good analytical, organizational and problem-solving skills.
- Ability and desire to learn new technologies quickly.
- Ability to elicit, gather and analyze user requirements.
- Ability to work independently and collaborate with others at all levels of technical understanding.
- Able to meet deadlines.
- Good judgment and project management skills.
- Ability to communicate verbally and in writing with both technical and non-technical staff.
- Ability to work in a team environment and have good interpersonal skills.
- Ability to adapt to changing technology and priorities.
- Must be able to work independently, handle multiple concurrent projects, and prioritize and manage them effectively.
- Must be able to interpret, validate and map business requirements to an appropriate solution.